Needed imports
In [31]:
# future-proof division (Python 2)
from __future__ import division
# plotting
import holoviews as hv
%load_ext holoviews.ipython
hv.notebook_extension('bokeh')
# scientific python usage
import numpy as np
from numpy import exp,cos,sin,pi,tan,sqrt
from scipy.stats import norm
# dedist package
from dedist import dedist
The dedist package calculates decoding distributions using multivariate Gaussians. To do so, it needs the basic tuning curves and a sufficiently dense (and preferably periodic) sampling of the stimulus space. First, we must define a tuning curve function of the following form.
def fun(x, x_, par):
    ''' Basic tuning curve function as used in the dedist package.

    Parameters
    ----------
    x : array
        Preferred stimulus values of the neurons
    x_ : array
        Stimulus values
    par : array
        Function parameters

    Returns
    -------
    array
        The response across neurons
    '''
    # do some thing
    # return some array giving the response of all neurons
So for example, a Gaussian tuning curve:
In [34]:
def fun(x, x_, par):
    ''' Gaussian tuning curve.

    Parameters
    ----------
    x, x_ : arrays
        Preferred and actual stimuli
    par : array or list
        Tuning curve parameters, given as [a, s], with
        a the maximum response and s**2 the variance.

    Returns
    -------
    array
        Of the form a*exp(-(x-x_)**2/(2*s**2))
    '''
    return par[0]*exp(-(x - x_)**2/(2*par[1]**2))
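As a quick sanity check (this cell is an addition, not part of the original notebook), we can verify the broadcasting trick the later cells rely on: evaluating `fun(A, As[:,None], par)` yields one row per sampled stimulus and one column per neuron, and each row peaks at the neuron whose preferred value matches that stimulus.

```python
# Sanity check of the Gaussian tuning curve and its broadcasting behaviour.
import numpy as np
from numpy import exp, pi

def fun(x, x_, par):
    # Gaussian tuning curve: a * exp(-(x - x_)**2 / (2 * s**2))
    return par[0] * exp(-(x - x_)**2 / (2 * par[1]**2))

N = ns = 64
A = np.linspace(-pi, pi, N + 1)[:-1]    # preferred stimuli of the neurons
As = np.linspace(-pi, pi, ns + 1)[:-1]  # sampled stimulus values

F = fun(A, As[:, None], [1, 1])         # broadcasts to shape (ns, N)
print(F.shape)                          # (64, 64)
print(F[32].argmax())                   # 32: the neuron preferring As[32] responds most
```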
Then, we set some basic parameters
In [35]:
# number of neurons
N = 64
# gaussian noise sqrt(variance)
sigma = 1
# set of orientations
A = np.linspace(-pi,pi,N+1)[:-1]
# Stimulus sampling (the stimuli at which the probability dist should be calculated).
# The number of samples needs to be at least as high as N. For mirrored tuning
# functions (e.g. a function of the absolute value of the stimulus), the stimulus
# samples should only cover half of the possible inputs.
ns = 64
As = np.linspace(-pi,pi,ns+1)[:-1]
# set parameters
par = [1,1]
First, let's simulate the system directly for a set number of noise realizations
In [36]:
# number of realizations
reals = 100000
# set stimulus index (such that the stimulus value is As[stim])
stim = 32
# find all possible responses across sample stimuli
F = fun(A, As[:,None], par)
# find the true response and add Gaussian noise (sigma is the noise std)
f = F[stim,:]
n = np.random.normal(scale=sigma, size=(reals,N))
r = f + n
# find total squared errors across possible stimuli
Errors = np.sum((r-F[:,None])**2,axis=2)
# find the minimum error stimuli for each noise realization
sol = As[Errors.argmin(axis=0)]
# plot decoding distribution histogram
dA = (As[1]-As[0])/2
bins = np.append(As, As[-1]+2*dA) - dA  # add an edge beyond the last stimulus
freq, edges = np.histogram(sol, bins=bins)
hv.Histogram(freq/freq.sum(), edges)
Out[36]:
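Beyond the histogram, the empirical decoding distribution can be summarised numerically. The cell below is a hedged addition (not part of the original notebook): since the stimulus space is periodic on [-pi, pi), it uses circular statistics via the resultant vector of the decoded angles, re-running a smaller, seeded version of the simulation above so it stands alone.

```python
# Circular mean and circular std of the decoded stimuli (addition, not dedist API).
import numpy as np
from numpy import exp, pi, sqrt

def fun(x, x_, par):
    # Gaussian tuning curve, as defined earlier in the notebook
    return par[0] * exp(-(x - x_)**2 / (2 * par[1]**2))

N = ns = 64
A = np.linspace(-pi, pi, N + 1)[:-1]
As = np.linspace(-pi, pi, ns + 1)[:-1]
par, sigma, stim, reals = [1, 1], 1.0, 32, 10000

rng = np.random.default_rng(0)
F = fun(A, As[:, None], par)                           # (ns, N) response matrix
r = F[stim] + rng.normal(scale=sigma, size=(reals, N)) # noisy responses
Errors = np.sum((r - F[:, None])**2, axis=2)           # (ns, reals) squared errors
sol = As[Errors.argmin(axis=0)]                        # minimum-error decoded stimuli

# circular statistics from the mean resultant vector z
z = np.exp(1j * sol).mean()
circ_mean = np.angle(z)              # should be near the true stimulus As[32] = 0
circ_std = sqrt(-2 * np.log(abs(z)))
print(circ_mean, circ_std)
```

The circular std is a convenient single-number summary of decoding precision when comparing parameter settings.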
Next, let's sample some error profiles from the multivariate probability distribution that represents the behaviour of the above system.
In [37]:
# sample error profiles from the multivariate Gaussian
sol_th,Errors,means,cov = dedist.sample_E(fun,As[stim],par,sigma,A,As,reals,True)
# get decoding distribution
freq, edges = np.histogram(sol_th, bins=np.append(As, As[-1]+2*dA) - dA)
hv.Histogram(freq/freq.sum(), edges)
Out[37]:
And finally, let's do the same, but by calculating the probability that a given stimulus has the minimum error.
In [38]:
p = dedist.est_p(fun,As[stim],par,sigma,A,As,verbose=False)
hv.Histogram(freq/freq.sum(),edges)*hv.Curve(zip(As,p))
Out[38]:
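Overlaying the curve on the histogram gives a visual check; to quantify the agreement between the sampled histogram and the estimated probabilities, one simple option (an addition, not part of the dedist package) is the total variation distance between the two normalised distributions. The vectors below are small placeholders standing in for `freq/freq.sum()` and `p`.

```python
# Total variation distance between two discrete decoding distributions.
import numpy as np

def total_variation(p, q):
    p = np.asarray(p, float)
    q = np.asarray(q, float)
    p, q = p / p.sum(), q / q.sum()   # normalise both to sum to 1
    return 0.5 * np.abs(p - q).sum()

# placeholder distributions (hypothetical values for illustration)
a = np.array([0.1, 0.2, 0.4, 0.2, 0.1])
b = np.array([0.1, 0.25, 0.3, 0.25, 0.1])
print(total_variation(a, b))  # approximately 0.1
```

A value of 0 means the distributions coincide exactly; 1 means they have disjoint support.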
In [ ]: